A Filtering Approach to Stochastic Variational Inference

Authors

  • Neil Houlsby
  • David M. Blei
Abstract

Stochastic variational inference (SVI) uses stochastic optimization to scale up Bayesian computation to massive data. We present an alternative perspective on SVI as approximate parallel coordinate ascent. SVI trades off bias and variance to step close to the unknown true coordinate optimum given by batch variational Bayes (VB). We define a model to automate this process. The model infers the location of the next VB optimum from a sequence of noisy realizations. As a consequence of this construction, we update the variational parameters using Bayes' rule, rather than a hand-crafted optimization schedule. When our model is a Kalman filter, this procedure can recover the original SVI algorithm and SVI with adaptive steps. We may also encode additional assumptions in the model, such as heavy-tailed noise. By doing so, our algorithm outperforms the original SVI schedule and a state-of-the-art adaptive SVI algorithm in two diverse domains.
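
A minimal sketch (not the authors' implementation) of the filtering view described in the abstract, assuming the simplest possible case: a scalar random-walk Kalman filter tracking the batch-VB optimum of a single natural parameter. Each minibatch yields a noisy local optimum lam_hat, and the Kalman gain plays the role of the SVI step size, so the update keeps the familiar convex-combination form while the step is set by Bayes' rule rather than a hand-tuned schedule. The function name kalman_svi_step and the noise variances q_drift and r_obs are illustrative assumptions, not quantities taken from the paper.

    import numpy as np

    def kalman_svi_step(mean, var, lam_hat, q_drift=1e-3, r_obs=1.0):
        """One filtering update: predict the slowly drifting optimum, then
        correct with the noisy minibatch optimum lam_hat. Returns the new
        posterior mean, posterior variance, and the implied step size."""
        var_pred = var + q_drift                      # predict: the optimum drifts slowly
        gain = var_pred / (var_pred + r_obs)          # Kalman gain acts as the learning rate
        mean_new = (1.0 - gain) * mean + gain * lam_hat
        var_new = (1.0 - gain) * var_pred
        return mean_new, var_new, gain

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        true_optimum = 2.5                            # toy stand-in for the unknown batch-VB optimum
        mean, var = 0.0, 10.0                         # broad prior over the optimum
        for t in range(1, 51):
            lam_hat = true_optimum + rng.normal(scale=1.0)   # noisy per-minibatch optimum
            mean, var, gain = kalman_svi_step(mean, var, lam_hat)
            if t % 10 == 0:
                print(f"t={t:2d}  step={gain:.3f}  estimate={mean:.3f}")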

Related articles

Variational filtering

This note presents a simple Bayesian filtering scheme, using variational calculus, for inference on the hidden states of dynamic systems. Variational filtering is a stochastic scheme that propagates particles over a changing variational energy landscape, such that their sample density approximates the conditional density of hidden states and inputs. The key innovation, on which variational ...

Adaptively Setting the Learning Rate in Stochastic Variational Inference

Stochastic variational inference is a promising method for fitting large-scale probabilistic models with hidden structures. Different from traditional stochastic learning, stochastic variational inference uses the natural gradient, which is particularly efficient for computing probabilistic distributions. One of the issues in stochastic variational inference is to set an appropriate learning ra...

Quantized Variational Filtering for Bayesian Inference in Wireless Sensor Networks

The primary focus of the chapter is to study the Bayesian inference problem in distributed WSNs with particular emphasis on the trade-off between estimation precision and energy-awareness. We propose a variational approach that approximates the particle distribution by a single Gaussian distribution, while respecting the communication constraints of WSNs. The efficiency of the variational approxim...

Stochastic Variational Inference for Gaussian Process Latent Variable Models using Back Constraints

Gaussian process latent variable models (GPLVMs) are a probabilistic approach to modelling data that employs a Gaussian process mapping from latent variables to observations. This paper revisits a recently proposed variational inference technique for GPLVMs and methodologically analyses the optimality and different parameterisations of the variational approximation. We investigate a structured va...

Variational probabilistic inference and the QMR-DT database

We describe a variational approximation method for efficient inference in large-scale probabilistic models. Variational methods are deterministic procedures that provide approximations to marginal and conditional probabilities of interest. They provide alternatives to approximate inference methods based on stochastic sampling or search. We describe a variational approach to the problem of diagnos...

Publication date: 2014